Variable Metric Method for Unconstrained Multiobjective Optimization Problems

Authors

Abstract

In this paper, we propose a variable metric method for unconstrained multiobjective optimization problems (MOPs). First, a sequence of points is generated using different positive definite matrices in a generic framework. It is proved that the accumulation points of this sequence are Pareto critical points. Then, without any convexity assumption, strong convergence of the proposed method is established. Moreover, we use a common matrix to approximate the Hessian matrices of all objective functions, along with a new nonmonotone line search technique, to achieve a local superlinear convergence rate. Finally, several numerical results demonstrate the effectiveness of the proposed method.
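For context, variable metric (quasi-Newton-type) methods for MOPs typically compute the search direction from the subproblem min over (t, d) of t + 0.5 d^T B d subject to grad f_i(x)^T d <= t for every objective, where B is a positive definite metric matrix (e.g., a common Hessian approximation for all objectives). The sketch below illustrates one such step; the function names, tolerances, and the plain monotone Armijo rule are illustrative assumptions, not the paper's exact nonmonotone scheme.

```python
# Minimal sketch of one variable metric iteration for an unconstrained MOP,
# assuming the standard direction-finding subproblem
#     min_{t, d}  t + 0.5 * d^T B d    s.t.  grad f_i(x)^T d <= t,  i = 1..m.
# Helper names and the simple Armijo rule are illustrative, not the paper's scheme.
import numpy as np
from scipy.optimize import minimize


def vm_direction(G, B):
    """Solve the direction-finding subproblem; G stacks the gradients row-wise."""
    m, n = G.shape

    def obj(z):                       # z = (t, d)
        return z[0] + 0.5 * z[1:] @ B @ z[1:]

    cons = [{"type": "ineq", "fun": lambda z, g=g: z[0] - g @ z[1:]} for g in G]
    res = minimize(obj, np.zeros(n + 1), constraints=cons, method="SLSQP")
    return res.x[1:], res.fun         # d(x) and theta(x); theta(x) = 0 iff x is Pareto critical


def vm_step(x, objectives, gradients, B, sigma=1e-4, tau=0.5):
    """One variable metric step with Armijo-type backtracking on every objective."""
    G = np.array([g(x) for g in gradients])
    d, theta = vm_direction(G, B)
    if theta > -1e-8:                 # x is (numerically) Pareto critical: stop
        return x, True
    fx = np.array([f(x) for f in objectives])
    slope = G @ d                     # per-objective directional derivatives
    alpha = 1.0
    for _ in range(50):               # backtrack until sufficient decrease in all objectives
        fnew = np.array([f(x + alpha * d) for f in objectives])
        if np.all(fnew <= fx + sigma * alpha * slope):
            break
        alpha *= tau
    return x + alpha * d, False
```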


Similar Articles

Multiobjective Optimization of Mixed Variable Design Problems

In this paper, a new multiobjective genetic algorithm is employed to support the design of a hydraulic actuation system. First, the proposed method is tested on benchmark problems gathered from the literature. The method performs well and is capable of identifying multiple Pareto frontiers in multimodal function spaces. Second, the method is applied to a mixed variable design problem w...


Random Perturbation of the Variable Metric Method for Unconstrained Nonsmooth Nonconvex Optimization

We consider the global optimization of a nonsmooth (nondifferentiable) nonconvex real function. We introduce a variable metric descent method adapted to nonsmooth situations, which is modified by the incorporation of suitable random perturbations. Convergence to a global minimum is established and a simple method for the generation of suitable perturbations is introduced. An algorithm is propos...
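As a rough illustration of the random-perturbation idea described above (not the paper's specific perturbation scheme or its convergence conditions), the sketch below adds shrinking random perturbations after each descent step and keeps the best candidate. It assumes a (sub)gradient oracle `grad` and, for simplicity, a fixed identity metric; all names and constants are illustrative.

```python
# Sketch: descent iterations with random perturbations of shrinking radius,
# used to try to escape local minima of a nonconvex objective f.
import numpy as np


def perturbed_descent(f, grad, x0, steps=200, trials=5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    B = np.eye(x.size)                        # fixed metric kept simple here
    for k in range(1, steps + 1):
        d = -np.linalg.solve(B, grad(x))      # variable metric descent direction
        alpha = 1.0
        while f(x + alpha * d) > fx and alpha > 1e-12:
            alpha *= 0.5                      # crude backtracking
        x_best = x + alpha * d
        f_best = f(x_best)
        radius = 1.0 / np.sqrt(k)             # perturbation size shrinking with k
        for _ in range(trials):               # random perturbations around the new point
            cand = x_best + radius * rng.standard_normal(x.size)
            fc = f(cand)
            if fc < f_best:
                x_best, f_best = cand, fc
        x, fx = x_best, f_best
    return x, fx
```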


An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
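To fix notation only: a nonlinear conjugate gradient iteration updates the direction as d_{k+1} = -g_{k+1} + beta_k d_k. The sketch below uses the classical Polak-Ribiere+ parameter with Armijo backtracking; the paper's own parameter, derived from a variant of the modified secant condition, is not reproduced here, and all names are illustrative.

```python
# Generic nonlinear conjugate gradient loop with the Polak-Ribiere+ parameter.
import numpy as np


def nonlinear_cg(f, grad, x0, iters=500, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5                                   # Armijo backtracking
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))     # Polak-Ribiere+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```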


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
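The "standard secant relation" mentioned above is B_{k+1} s_k = y_k, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The snippet below checks that relation for the plain BFGS update, shown only as a reference point; it is not the paper's double parameter scaled formula. The BFGS update also preserves positive definiteness whenever s_k^T y_k > 0.

```python
# Reference check: the classical BFGS update satisfies the secant relation
# B_{k+1} s = y and stays positive definite when the curvature condition s^T y > 0 holds.
import numpy as np


def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)


# On a quadratic 0.5 * x^T A x the gradient difference is y = A s, so s^T y > 0.
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])              # symmetric positive definite "true" Hessian
s = np.array([0.4, -0.3, 0.7])               # step  x_{k+1} - x_k
y = A @ s                                    # gradient difference  g_{k+1} - g_k
B1 = bfgs_update(np.eye(3), s, y)
assert np.allclose(B1 @ s, y)                # secant relation holds
assert np.all(np.linalg.eigvalsh(B1) > 0)    # and B1 stays positive definite
```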



Journal

Journal title: Journal of the Operations Research Society of China

Year: 2022

ISSN: 2194-668X, 2194-6698

DOI: https://doi.org/10.1007/s40305-022-00447-z